Convergence Rates for Newton’s Method at Singular Points
Authors
Abstract
Similar articles
Convergence of the MRV Method at Singular Points
In this paper we give sufficient conditions for convergence of the Newton-like method with modification of the right-hand-side vector (MRV) for a class of singular problems. The rate of convergence is sublinear. Numerical results are included which agree well with the theoretically proven results. AMS Mathematics Subject Classification (2000): 65H10
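The degraded convergence that motivates methods like MRV can be seen directly in an illustrative sketch (not the paper's MRV method): plain Newton's method applied at a singular root, where the derivative vanishes and quadratic convergence drops to linear.

```python
# Illustrative sketch (not the MRV method from the abstract): Newton's
# method at a singular root. For f(x) = x^2 the root x* = 0 has
# multiplicity 2, so f'(x*) = 0 and the Newton step x - f(x)/f'(x) = x/2
# only halves the error each iteration: linear, not quadratic, convergence.

def newton_errors(f, fprime, x0, steps):
    """Run a fixed number of Newton iterations, recording |x_k|."""
    x = x0
    errors = [abs(x)]
    for _ in range(steps):
        x = x - f(x) / fprime(x)
        errors.append(abs(x))
    return errors

errors = newton_errors(lambda x: x * x, lambda x: 2.0 * x, 1.0, 20)
ratios = [errors[k + 1] / errors[k] for k in range(len(errors) - 1)]
print(ratios[:3])  # each error is half the previous: [0.5, 0.5, 0.5]
```

For a simple root the same iteration would roughly square the error each step; the constant ratio here is the signature of singularity.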
Asymptotics at irregular singular points
Contents: Introduction; 1. Example: rotationally symmetric eigenfunctions on R; 2. Example: translation-equivariant eigenfunctions on H; 3. Beginning of construction of solutions; 4. K(x, t) is bounded; 5. End of construction of solutions; 6. Asymptotics of solutions; 7. Appendix: asymptotic expansions; Bibliography. According to [Erdélyi 1956], Thomé [1] found that differential equations with finite rank irr...
On convergence of certain nonlinear Durrmeyer operators at Lebesgue points
The aim of this paper is to study the behaviour of a certain sequence of nonlinear Durrmeyer operators $ND_{n}f$ of the form $$(ND_{n}f)(x)=\int\limits_{0}^{1}K_{n}\left(x,t,f\left(t\right)\right)\,dt,\quad 0\leq x\leq 1,\ n\in\mathbb{N},$$ acting on bounded functions on an interval $[0,1]$, where $K_{n}(x,t,u)$ satisfies some suitable assumptions. Here we estimate the rate...
Convergence analysis of product integration method for nonlinear weakly singular Volterra-Fredholm integral equations
In this paper, we study the numerical solution of nonlinear weakly singular Volterra-Fredholm integral equations using the product integration method. We also study the convergence behavior of a fully discrete version of a product integration method for the numerical solution of nonlinear Volterra-Fredholm integral equations. The reliability and efficiency of the proposed scheme are...
An Improved Gauss-Newton Method based Back-propagation Algorithm for Fast Convergence
The present work deals with an improved back-propagation algorithm based on Gauss-Newton numerical optimization method for fast convergence. The steepest descent method is used for the back-propagation. The algorithm is tested using various datasets and compared with the steepest descent back-propagation algorithm. In the system, optimization is carried out using multilayer neural network. The ...
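The Gauss-Newton update underlying such algorithms can be sketched in a minimal one-parameter least-squares example (a hypothetical illustration, not the paper's neural-network training loop):

```python
import math

# Minimal sketch of a Gauss-Newton iteration (hypothetical one-parameter
# example, not the paper's network): fit the model m(x; a) = exp(a*x) to
# data by least squares. Each step solves the normal equation
# (J^T J) da = -J^T r built from residuals r_i and Jacobian J_i = dr_i/da.

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * x) for x in xs]  # synthetic data with a_true = 0.5

a = 0.0  # initial guess
for _ in range(30):
    r = [math.exp(a * x) - y for x, y in zip(xs, ys)]  # residuals
    J = [x * math.exp(a * x) for x in xs]              # dr_i / da
    JtJ = sum(j * j for j in J)
    Jtr = sum(j * ri for j, ri in zip(J, r))
    a -= Jtr / JtJ                                     # Gauss-Newton step

print(round(a, 6))  # recovers a_true = 0.5
```

For a zero-residual problem like this one the Gauss-Newton step behaves like a full Newton step near the solution, which is what gives the fast convergence the abstract refers to, compared with the steepest-descent update used in standard back-propagation.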
Journal
Journal title: SIAM Journal on Numerical Analysis
Year: 1983
ISSN: 0036-1429,1095-7170
DOI: 10.1137/0720020